coefficient of agreement

coefficient of agreement
коэффициент согласия

English-Russian Dictionary of Aviation Medicine. 2013.


See what "coefficient of agreement" means in other dictionaries:

  • coefficient of concordance — noun: a coefficient of agreement (concordance) between different sets of rank orderings of the same set of things. Topics: statistics. Hypernyms: Kendall test. … Useful English Dictionary. (Sketched below.)

  • Coefficient of determination — In statistics, the coefficient of determination R² is used in the context of statistical models whose main purpose is the prediction of future outcomes on the basis of other related information. It is the proportion of variability in a data set… … Wikipedia. (Sketched below.)

  • Kendall tau rank correlation coefficient — The Kendall tau rank correlation coefficient (or simply the Kendall tau coefficient, Kendall's tau, or tau test(s)) is a non-parametric statistic used to measure the degree of correspondence between two rankings and to assess the significance of… … Wikipedia. (Sketched below.)

  • Concordance correlation coefficient — In statistics, the concordance correlation coefficient measures the agreement between two variables, e.g., to evaluate reproducibility or inter-rater reliability. Definition: Lawrence Lin's concordance correlation coefficient has the form… … Wikipedia. (Sketched below.)

  • Phi coefficient — In statistics, the phi coefficient (also referred to as the mean square contingency coefficient and denoted by φ or rφ) is a measure of association for two binary variables introduced by Karl Pearson. This measure is similar to the Pearson… … Wikipedia. (Sketched below.)

  • tau coefficient of correlation — noun: a nonparametric measure of the agreement between two rankings. Synonyms: Kendall's tau, Kendall rank correlation. Topics: statistics. Hypernyms: Kendall test. … Useful English Dictionary.

  • Inter-rater reliability — In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings given by judges. It is useful in refining the… … Wikipedia.

  • Cohen's kappa — Cohen's kappa coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than a simple percent-agreement calculation, since κ takes into… … Wikipedia. (Sketched below.)

  • Cohens Kappa — Cohen's kappa is a statistical measure of the inter-rater reliability of ratings by (usually) two raters, proposed by Jacob Cohen in 1960. The measure can also be used for intra-rater reliability, where the same… … Deutsch Wikipedia.

  • Fleiss' Kappa — Cohen's kappa is a statistical measure of the inter-rater reliability of ratings by (usually) two raters, proposed by Jacob Cohen in 1960. The equation for Cohen's kappa is κ = (p₀ − pₑ)/(1 − pₑ), where p₀ is the measured… … Deutsch Wikipedia. (Sketched below.)
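
The cross-references above all name concrete statistics, so a few minimal, hedged sketches follow. They are illustrations under stated assumptions, not the cited dictionaries' own material; every function name and data set is made up for the example, and each block is plain Python (with at most NumPy).

First, the coefficient of concordance, usually computed as Kendall's W for m rank orderings of the same n things: W = 12S / (m²(n³ − n)), with S the sum of squared deviations of the items' rank sums from their mean (no correction for tied ranks in this sketch).

    import numpy as np

    def kendalls_w(ranks):
        # ranks: m x n array, row i = the rank ordering given by rater i.
        ranks = np.asarray(ranks, dtype=float)
        m, n = ranks.shape
        col_sums = ranks.sum(axis=0)          # total rank given to each item
        s = ((col_sums - col_sums.mean()) ** 2).sum()
        return 12.0 * s / (m ** 2 * (n ** 3 - n))

    # Three identical orderings of four items agree perfectly: W = 1.0.
    print(kendalls_w([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))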
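Next, the coefficient of determination in its textbook form R² = 1 − SS_res / SS_tot: the proportion of the variability in a data set accounted for by a model's predictions (the data here are invented).

    import numpy as np

    def r_squared(y_true, y_pred):
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        ss_res = ((y_true - y_pred) ** 2).sum()         # residual sum of squares
        ss_tot = ((y_true - y_true.mean()) ** 2).sum()  # total sum of squares
        return 1.0 - ss_res / ss_tot

    print(r_squared([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8]))  # ~0.98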
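Kendall's tau (the tau-a variant, which ignores ties) compares every pair of items: a pair is concordant when the two rankings order it the same way, discordant when they order it oppositely, and tau is their difference over the number of pairs.

    from itertools import combinations

    def kendall_tau(x, y):
        n = len(x)
        concordant = discordant = 0
        for i, j in combinations(range(n), 2):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1      # same order in both rankings
            elif s < 0:
                discordant += 1      # opposite order
        return (concordant - discordant) / (n * (n - 1) / 2)

    # One swapped pair out of six: tau = (5 - 1) / 6.
    print(kendall_tau([1, 2, 3, 4], [1, 2, 4, 3]))  # ~0.667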
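Lin's concordance correlation coefficient rewards both correlation and equality of scale and location: ρc = 2·cov(x, y) / (var(x) + var(y) + (mean(x) − mean(y))²), using biased (divide-by-n) moments. Unlike Pearson's r, a constant offset between the two variables pulls it below 1.

    import numpy as np

    def concordance_ccc(x, y):
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxy = np.mean((x - x.mean()) * (y - y.mean()))   # biased covariance
        return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

    print(concordance_ccc([1, 2, 3], [1, 2, 3]))  # 1.0: exact agreement
    print(concordance_ccc([1, 2, 3], [2, 3, 4]))  # 4/7: offset, though r = 1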
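The phi coefficient for two binary variables reduces to a 2×2 contingency table [[a, b], [c, d]]: φ = (ad − bc) / √((a+b)(c+d)(a+c)(b+d)), which equals the Pearson correlation of the two 0/1 variables.

    import math

    def phi_coefficient(table):
        (a, b), (c, d) = table
        denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
        return (a * d - b * c) / denom

    # The two variables always agree, so the association is perfect.
    print(phi_coefficient([[10, 0], [0, 10]]))  # 1.0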
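Finally, Cohen's kappa as in the entries above: κ = (p₀ − pₑ)/(1 − pₑ), where p₀ is the observed proportion of agreement between the two raters and pₑ the agreement expected by chance from their marginal label frequencies. Discounting chance agreement is what makes it more robust than raw percent agreement.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        n = len(rater_a)
        p0 = sum(a == b for a, b in zip(rater_a, rater_b)) / n     # observed
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        pe = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2   # chance
        return (p0 - pe) / (1 - pe)

    a = ["yes", "yes", "no", "no", "yes", "no"]
    b = ["yes", "no",  "no", "no", "yes", "yes"]
    print(cohens_kappa(a, b))  # p0 = 2/3, pe = 1/2, kappa = 1/3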

